Improving Penalized Least Squares through Adaptive Selection of Penalty and Shrinkage

Author

  • RUDOLF BERAN
Abstract

Estimation of the mean function in nonparametric regression is usefully separated into estimating the means at the observed factor levels—a one-way layout problem—and interpolation between the estimated means at adjacent factor levels. Candidate penalized least squares (PLS) estimators for the mean vector of a one-way layout are expressed as shrinkage estimators relative to an orthogonal regression basis determined by the penalty matrix. The shrinkage representation of PLS suggests a larger class of candidate monotone shrinkage (MS) estimators. Adaptive PLS and MS estimators choose the shrinkage vector and penalty matrix to minimize estimated risk. The actual risks of shrinkage-adaptive estimators depend strongly upon the economy of the penalty basis in representing the unknown mean vector. Local annihilators of polynomials, among them difference operators, generate penalty bases that are economical in a range of examples. Diagnostic techniques for adaptive PLS or MS estimators include basis-economy plots and estimates of loss or risk.
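
The abstract characterizes PLS fits as shrinkage applied coordinatewise in the orthogonal basis that diagonalizes the penalty matrix, with the amount of shrinkage chosen to minimize estimated risk. The sketch below is a minimal numerical illustration of that idea, not the paper's code: it assumes a Gaussian one-way layout with one observation per factor level and a known noise variance sigma2, uses a second-difference penalty (a local annihilator of linear polynomials), and selects the penalty weight over an ad hoc grid by minimizing a Mallows/Stein-type risk estimate. The function names and the lambda grid are illustrative choices, not taken from the paper.

import numpy as np

def second_difference(n):
    """Second-difference operator: each row annihilates linear trends,
    a simple local annihilator of degree-one polynomials."""
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

def adaptive_pls(y, sigma2, lambdas=np.logspace(-3, 4, 200)):
    """Fit mu_hat = argmin ||y - mu||^2 + lam * ||D mu||^2, choosing lam
    to minimize an estimate of risk. Returns (mu_hat, best lam)."""
    n = len(y)
    D = second_difference(n)
    A = D.T @ D                             # penalty matrix
    eigvals, U = np.linalg.eigh(A)          # penalty basis: A = U diag(eigvals) U'
    z = U.T @ y                             # coordinates of y in the penalty basis
    best = (np.inf, None, None)
    for lam in lambdas:
        f = 1.0 / (1.0 + lam * eigvals)     # PLS shrinkage factors in this basis
        # Estimated risk of the shrinkage estimator f * z; z_k^2 - sigma2 is an
        # unbiased estimate of the squared mean component in coordinate k.
        est_risk = np.sum(sigma2 * f**2 + (1.0 - f)**2 * (z**2 - sigma2))
        if est_risk < best[0]:
            best = (est_risk, lam, f)
    _, lam, f = best
    return U @ (f * z), lam

A monotone shrinkage (MS) estimator in the sense of the abstract would instead optimize the shrinkage factors directly over monotone vectors in the same penalty basis, rather than restricting them to the one-parameter family 1/(1 + lambda * eigenvalue) used here.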

Similar articles

Hybrid Shrinkage Estimators Using Penalty Bases for the Ordinal One-way Layout

This paper constructs improved estimators of the means in the Gaussian saturated one-way layout with an ordinal factor. The least squares estimator for the mean vector in this saturated model is usually inadmissible. The hybrid shrinkage estimators of this paper exploit the possibility of slow variation in the dependence of the means on the ordered factor levels but do not assume it and respond...

Simultaneous regression shrinkage, variable selection, and supervised clustering of predictors with OSCAR.

Variable selection can be challenging, particularly in situations with a large number of predictors with possibly high correlations, such as gene expression data. In this article, a new method called the OSCAR (octagonal shrinkage and clustering algorithm for regression) is proposed to simultaneously select variables while grouping them into predictive clusters. In addition to improving predict...

Adaptive Estimation of Directional Trend

Consider a one-way layout with one directional observation per factor level. Each observed direction is a unit vector in R^3, measured with random error. Information accompanying the measurements suggests that the mean directions, normalized to unit length, follow a trend: the factor levels are ordinal and mean directions at nearby factor levels may be close. Measured positions of the paleomagneti...

Simultaneous regression shrinkage, variable selection and clustering of predictors with OSCAR

In this paper, a new method called the OSCAR (Octagonal Shrinkage and Clustering Algorithm for Regression) is proposed to simultaneously select variables and perform supervised clustering in the context of linear regression. The technique is based on penalized least squares with a geometrically intuitive penalty function that, like the LASSO penalty, shrinks some coefficients to exactly zero. A...
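
For background on the penalty referred to in the two OSCAR entries above (stated here in its standard form, since both summaries are truncated), the estimator solves a constrained least squares problem of the form

$$\hat\beta \;=\; \arg\min_{\beta}\; \|y - X\beta\|_2^2 \quad\text{subject to}\quad \sum_{j}|\beta_j| \;+\; c\sum_{j<k}\max\{|\beta_j|,\,|\beta_k|\} \;\le\; t,$$

where the L1 part sets some coefficients exactly to zero and the pairwise L-infinity terms encourage groups of correlated predictors to share a common coefficient magnitude; in two dimensions the constraint region is an octagon, which gives the method its name.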

Robust Regression through the Huber’s criterion and adaptive lasso penalty

Huber’s criterion is a useful method for robust regression. The adaptive least absolute shrinkage and selection operator (lasso) is a popular technique for simultaneous estimation and variable selection. The adaptive weights in the adaptive lasso allow it to enjoy the oracle properties. In this paper we propose to combine Huber’s criterion with an adaptive lasso penalty. This regression tech...
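
As a hedged sketch of the combination described above (an illustrative form, not necessarily the paper's exact criterion, which may for instance include a concomitant scale parameter), a Huber loss can be paired with an adaptive lasso penalty as

$$\hat\beta \;=\; \arg\min_{\beta}\; \sum_{i=1}^{n} H_M\!\big(y_i - x_i^{\top}\beta\big) \;+\; \lambda \sum_{j=1}^{p} w_j\,|\beta_j|, \qquad H_M(r)=\begin{cases} r^2/2, & |r|\le M,\\ M|r|-M^2/2, & |r|>M,\end{cases}$$

with data-driven weights such as $w_j = 1/|\tilde\beta_j|^{\gamma}$ computed from a preliminary estimate $\tilde\beta$; it is these adaptive weights that yield the oracle properties mentioned in the summary.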

Journal title:

Volume:   Issue:

Pages:  -

Publication date: 2001